    A customizable, low-cost optomotor apparatus: a powerful tool for behaviourally measuring visual capability

    This is the final version, available on open access from Wiley via the DOI in this record. Data Accessibility: A full parts list, 3D models, assembly instructions, and microcontroller code are provided under a Creative Commons license at the GitHub repository (https://github.com/troscianko/optomotor) (DOI: 10.5281/zenodo.3840063) and as supplementary material. A construction guide, video guide, user forum, and future updates are provided at http://www.empiricalimaging.com/optomotor/. 1. Vision is the dominant sense for many animals, and there is enormous diversity in visual capabilities. Understanding the visual abilities of a given species can therefore be key to investigating its behaviour and evolution. However, many techniques for quantifying visual capability are expensive, require specialized equipment, or are terminal for the animal. 2. Here, we discuss how to measure the optomotor (or optokinetic) response, an innate response that can be elicited without any training in a wide range of taxa and that is quantifiable, accessible, and non-invasive, and we provide guidance for carrying out optomotor experiments. 3. We provide instructions for building a customizable, programmable optomotor apparatus using 3D-printed and low-cost materials, discuss experimental design considerations for optomotor assays, include a guide that calculates the dimensions of stimuli of varying spatial frequency, and provide a table summarizing experimental parameters used in prior optomotor experiments across a range of species. 4. Ultimately, making this simple technique more accessible will allow more researchers to incorporate measures of visual capability into their work. Additionally, the low cost and ease of construction of our apparatus will allow educators in a variety of settings to include optomotor assays in classroom activities or demonstrations. Although here we focus on using the optomotor response to measure visual acuity—the ability to perceive detail—the apparatus and stimuli described here can be adapted to measure other visual capabilities, including spectral, contrast, and polarization sensitivity, as well as motion detection. Funding: European Union Horizon 2020; Natural Environment Research Council (NERC); Royal Society.
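
    The stimulus-dimension guide mentioned above is provided in the repository; as an illustration of the underlying geometry only, the short sketch below computes the printed stripe width needed for a target spatial frequency, assuming the animal sits at the centre of a cylindrical drum. The function name, drum radius, and example frequency are illustrative assumptions, not values taken from the published guide.

        import math

        def stripe_width_mm(spatial_freq_cpd, drum_radius_mm):
            """Width of one stripe (half a cycle) of a square-wave grating, in mm
            of arc on the drum wall, for a viewer at the centre of the drum."""
            cycle_angle_rad = math.radians(1.0 / spatial_freq_cpd)  # angle subtended by one full cycle
            cycle_arc_mm = drum_radius_mm * cycle_angle_rad         # arc length of one cycle on the drum
            return cycle_arc_mm / 2.0                               # one black or white stripe

        # Example: a 0.5 cycles-per-degree grating in a drum of 60 mm radius
        # needs stripes roughly 1.05 mm wide.
        print(round(stripe_width_mm(0.5, 60.0), 2))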

    Multi-camera real-time three-dimensional tracking of multiple flying animals

    Automated tracking of animal movement allows analyses that would not otherwise be possible by providing great quantities of data. The additional capability of tracking in real time—with minimal latency—opens up the experimental possibility of manipulating sensory feedback, thus allowing detailed explorations of the neural basis for control of behaviour. Here, we describe a system capable of tracking the three-dimensional position and body orientation of animals such as flies and birds. The system operates with less than 40 ms latency and can track multiple animals simultaneously. To achieve these results, a multi-target tracking algorithm was developed based on the extended Kalman filter and the nearest neighbour standard filter data association algorithm. In one implementation, an 11-camera system is capable of tracking three flies simultaneously at 60 frames per second using a gigabit network of nine standard Intel Pentium 4 and Core 2 Duo computers. This manuscript presents the rationale and details of the algorithms employed and shows three implementations of the system. An experiment was performed using the tracking system to measure the effect of visual contrast on the flight speed of Drosophila melanogaster. At low contrasts, speed is more variable and faster on average than at high contrasts. Thus, the system is already a useful tool to study the neurobiology and behaviour of freely flying animals. If combined with other techniques, such as ‘virtual reality’-type computer graphics or genetic manipulation, the tracking system would offer a powerful new way to investigate the biology of flying animals
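
    The tracking core described above combines a Kalman-filter motion model with nearest-neighbour data association. A stripped-down sketch of that combination is shown below, simplified to a linear constant-velocity Kalman filter over already-triangulated 3D positions (the published system uses an extended Kalman filter fed by multi-camera observations); the time step, noise levels, and gating distance are illustrative values, not the system's parameters.

        import numpy as np

        class Track:
            """Constant-velocity Kalman track in 3D; state is (x, y, z, vx, vy, vz)."""
            def __init__(self, pos, dt=1.0 / 60.0, q=1e-3, r=1e-2):
                self.x = np.hstack([pos, np.zeros(3)])                 # state estimate
                self.P = np.eye(6)                                     # state covariance
                self.F = np.eye(6); self.F[:3, 3:] = dt * np.eye(3)    # constant-velocity motion model
                self.H = np.hstack([np.eye(3), np.zeros((3, 3))])      # observe position only
                self.Q = q * np.eye(6)                                 # process noise
                self.R = r * np.eye(3)                                 # measurement noise

            def predict(self):
                self.x = self.F @ self.x
                self.P = self.F @ self.P @ self.F.T + self.Q
                return self.H @ self.x                                 # predicted position

            def update(self, z):
                S = self.H @ self.P @ self.H.T + self.R                # innovation covariance
                K = self.P @ self.H.T @ np.linalg.inv(S)               # Kalman gain
                self.x = self.x + K @ (z - self.H @ self.x)
                self.P = (np.eye(6) - K @ self.H) @ self.P

        def associate_and_update(tracks, detections, gate=0.05):
            """Greedy nearest-neighbour association: each track claims the closest
            unclaimed detection within the gating distance (metres) and updates on it."""
            remaining = [np.asarray(z, dtype=float) for z in detections]
            for track in tracks:
                predicted = track.predict()
                if not remaining:
                    continue
                dists = [np.linalg.norm(predicted - z) for z in remaining]
                best = int(np.argmin(dists))
                if dists[best] < gate:
                    track.update(remaining.pop(best))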

    Bio-inspired motion detection in an FPGA-based smart camera module

    Köhler T, Roechter F, Lindemann JP, Möller R. Bio-inspired motion detection in an FPGA-based smart camera module. Bioinspiration & Biomimetics. 2009;4(1):015008. Flying insects, despite their relatively coarse vision and tiny nervous system, are capable of carrying out elegant and fast aerial manoeuvres. Studies of the fly visual system have shown that this is accomplished by the integration of signals from a large number of elementary motion detectors (EMDs) in just a few global flow detector cells. We developed an FPGA-based smart camera module with more than 10,000 single EMDs, which is closely modelled after insect motion-detection circuits with respect to overall architecture, resolution, and inter-receptor spacing. Input to the EMD array is provided by a CMOS camera with a high frame rate. Designed as an adaptable solution for different engineering applications and as a testbed for biological models, the EMD detector type and parameters such as the EMD time constants, the motion-detection directions, and the angle between correlated receptors are reconfigurable online. This allows flexible and simultaneous detection of complex motion fields such as translation, rotation, and looming, such that various tasks, e.g. obstacle avoidance, height/distance control, or speed regulation, can be performed by the same compact device.
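
    As context for the correlation-type EMDs mentioned above, the sketch below is a minimal software Hassenstein-Reichardt detector operating on two neighbouring photoreceptor signals. The first-order low-pass delay and the 35 ms time constant are illustrative assumptions, not the module's actual (online-reconfigurable) parameters.

        import numpy as np

        def emd_response(left, right, dt=1e-3, tau=0.035):
            """Hassenstein-Reichardt elementary motion detector. Each photoreceptor
            signal is delayed by a first-order low-pass filter (time constant tau,
            an assumed value) and multiplied with the undelayed signal from the
            neighbouring receptor; subtracting the two mirror-symmetric half-
            detectors gives an output whose sign indicates motion direction."""
            alpha = dt / (tau + dt)
            def lowpass(sig):
                sig = np.asarray(sig, dtype=float)
                out = np.zeros_like(sig)
                for i in range(1, len(sig)):
                    out[i] = out[i - 1] + alpha * (sig[i] - out[i - 1])
                return out
            return lowpass(left) * np.asarray(right, dtype=float) \
                   - lowpass(right) * np.asarray(left, dtype=float)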

    Coding Efficiency of Fly Motion Processing Is Set by Firing Rate, Not Firing Precision

    To comprehend the principles underlying sensory information processing, it is important to understand how the nervous system deals with various sources of perturbation. Here, we analyze how the representation of motion information in the fly's nervous system changes with temperature and luminance. Although these two environmental variables have a considerable impact on the fly's nervous system, they do not prevent the fly from behaving appropriately over a wide range of conditions. We recorded responses from a motion-sensitive neuron, the H1-cell, to a time-varying stimulus at many different combinations of temperature and luminance. We found that the mean firing rate, but not firing precision, changes with temperature, while both are affected by mean luminance. Because we also found that information rate and coding efficiency are mainly set by the mean firing rate, our results suggest that, in the face of environmental perturbations, coding efficiency is improved by an increase in the mean firing rate rather than by increased firing precision.
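
    The analysis above compares mean firing rate with spike-timing precision across repeated stimulus presentations. A rough sketch of how those two quantities might be computed from trial-aligned spike times is given below; the 2 ms bins, 3 ms smoothing kernel, and correlation-based precision index are illustrative choices, not the information-theoretic measures used in the study.

        import numpy as np

        def rate_and_precision(spike_trials, duration_s, bin_s=0.002, sigma_s=0.003):
            """Return the trial-averaged firing rate (Hz) and a simple timing-precision
            index: the mean pairwise correlation between Gaussian-smoothed single-trial
            rate functions (near 1 = highly repeatable spike timing across trials)."""
            n_bins = int(round(duration_s / bin_s))
            edges = np.linspace(0.0, duration_s, n_bins + 1)
            t = np.arange(-3 * sigma_s, 3 * sigma_s + bin_s, bin_s)    # smoothing kernel support
            kernel = np.exp(-t ** 2 / (2 * sigma_s ** 2))
            kernel /= kernel.sum()
            rates, smoothed = [], []
            for spikes in spike_trials:                                # one array of spike times per trial
                counts, _ = np.histogram(spikes, bins=edges)
                rates.append(len(spikes) / duration_s)
                smoothed.append(np.convolve(counts, kernel, mode="same"))
            corr = np.corrcoef(np.vstack(smoothed))                    # trial-by-trial similarity matrix
            upper = np.triu_indices(len(spike_trials), k=1)
            return float(np.mean(rates)), float(np.mean(corr[upper]))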

    A Model for Detection of Angular Velocity of Image Motion Based on the Temporal Tuning of the Drosophila

    We propose a new bio-plausible model, based on the visual system of Drosophila, for estimating the angular velocity of image motion in insect eyes. The model implements both preferred-direction motion enhancement and non-preferred-direction motion suppression, mechanisms recently discovered in Drosophila's visual neural circuits, to give stronger directional selectivity. In addition, the angular velocity detecting model (AVDM) produces a response largely independent of spatial frequency in grating experiments, a property that would enable insects to estimate flight speed in cluttered environments. This is also consistent with behavioural experiments in which honeybees fly through tunnels lined with stripes of different spatial frequencies.

    Visually Guided Avoidance in the Chameleon (Chamaeleo chameleon): Response Patterns and Lateralization

    The common chameleon, Chamaeleo chameleon, is an arboreal lizard with highly independent, large-amplitude eye movements. In response to a moving threat, a chameleon on a perch responds with distinct avoidance movements that are expressed in its continuous positioning on the side of the perch distal to the threat. We analyzed body-exposure patterns during threat avoidance for evidence of lateralization, that is, asymmetry at the functional/behavioral levels. Chameleons were exposed to a threat approaching horizontally from the left or right, as they held onto a vertical pole that was either wider or narrower than the width of their head, providing, respectively, monocular or binocular viewing of the threat. We found two equal-sized sub-groups, each displaying lateralization of motor responses to a given direction of stimulus approach. Such an anti-symmetrical distribution of lateralization in a population may be indicative of situations in which organisms are regularly exposed to crucial stimuli from all spatial directions. This is because a bimodal distribution of responses to threat in a natural population will reduce the spatial advantage of predators

    Constant Angular Velocity Regulation for Visually Guided Terrain Following

    Insects use visual cues to control their flight behaviours. By estimating the angular velocity of the visual stimuli and regulating it to a constant value, honeybees can perform a terrain-following task that keeps them at a roughly constant height above undulating ground. To mimic this behaviour in a bio-plausible computational structure, this paper presents a new angular velocity decoding model based on behavioural experiments with honeybees. The model consists of three parts: a texture estimation layer for spatial information extraction, a motion detection layer for temporal information extraction, and a decoding layer that combines information from the previous layers to estimate the angular velocity. Compared to previous methods in this field, the proposed model produces responses largely independent of spatial frequency and contrast in grating experiments. An angular-velocity-based control scheme is proposed to implement the model in a bee simulated in the Unity game engine. Successful terrain following above patterned ground and flight over irregularly textured terrain show the model's potential for terrain following by micro unmanned aerial vehicles.
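
    The regulation principle described above (hold the ventral image's angular velocity at a set point by climbing or descending) can be sketched as a simple proportional controller. In the snippet below the set point, gain, and time step are illustrative values, and the angular velocity is computed geometrically from forward speed and ground clearance rather than from the paper's visual decoding layers.

        import math

        def terrain_following_step(height, forward_speed, ground_height,
                                   omega_set_deg=300.0, gain=0.02, dt=0.02):
            """One control step: measure the ventral image angular velocity
            (approximately forward_speed / clearance for a downward-looking eye),
            then climb when it exceeds the set point and descend when it falls
            below, which holds the angular velocity roughly constant."""
            clearance = max(height - ground_height, 0.05)        # metres; avoid division by zero
            omega_deg = math.degrees(forward_speed / clearance)  # ventral image velocity, deg/s
            climb_rate = gain * (omega_deg - omega_set_deg)      # proportional response, m/s
            return height + climb_rate * dt, omega_deg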

    Long-term results of radiotherapy for periarthritis of the shoulder: a retrospective evaluation

    Background: To evaluate retrospectively the results of radiotherapy for periarthritis of the shoulder. Methods: Between 1983 and 2004, 141 patients were treated; all had attended at least one follow-up examination. 19% had had pain for several weeks, 66% for months, and 14% for years. Shoulder motility was impaired in 137/140 patients. Nearly all patients had taken oral analgesics, 81% had undergone physiotherapy, five patients had been operated on, and six had been irradiated. Radiotherapy was applied using regular anterior-posterior opposing portals and Co-60 gamma rays or 4 MV photons. 89% of the patients received a total dose of 6 Gy (dose per fraction of 1 Gy, twice weekly); the others had total doses ranging from 4 to 8 Gy. The patients and the referring doctors were given written questionnaires in order to obtain long-term results. The mean duration of follow-up was 6.9 years (range 0–20 years). Results: At the first follow-up examination at the end of radiotherapy, 56% of the patients reported pain relief and improvement of motility. After a median of 4.5 months the values were 69% and 89%, and after 3.9 years 73% and 73%, respectively. There were virtually no side effects. In the questionnaires, 69% of the patients reported pain relief directly after radiotherapy and 31% up to 12 weeks after radiotherapy. 56% of the patients stated that pain relief had lasted for "years", and a further 12% for at least "months". Conclusion: Low-dose radiotherapy for periarthropathy of the shoulder was highly effective and yielded long-lasting improvement of pain and motility without side effects.

    A Model for the Detection of Moving Targets in Visual Clutter Inspired by Insect Physiology

    We present a computational model for target discrimination based on intracellular recordings from neurons in the fly visual system. Determining how insects detect and track small moving features, often against cluttered moving backgrounds, is an intriguing challenge, both from a physiological and a computational perspective. Previous research has characterized higher-order neurons within the fly brain, known as ‘small target motion detectors’ (STMD), that respond robustly to moving features, even when the velocity of the target is matched to the background (i.e. with no relative motion cues). We recorded from intermediate-order neurons in the fly visual system that are well suited as a component along the target detection pathway. This full-wave rectifying, transient cell (RTC) reveals independent adaptation to luminance changes of opposite signs (suggesting separate ON and OFF channels) and fast adaptive temporal mechanisms, similar to other cell types previously described. From this physiological data we have created a numerical model for target discrimination. This model includes nonlinear filtering based on the fly optics, the photoreceptors, the 1st order interneurons (Large Monopolar Cells), and the newly derived parameters for the RTC. We show that our RTC-based target detection model is well matched to properties described for the STMDs, such as contrast sensitivity, height tuning and velocity tuning. The model output shows that the spatiotemporal profile of small targets is sufficiently rare within natural scene imagery to allow our highly nonlinear ‘matched filter’ to successfully detect most targets from the background. Importantly, this model can explain this type of feature discrimination without the need for relative motion cues
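
    As a rough illustration of the delay-and-correlate logic behind such target discrimination, the sketch below runs a heavily reduced ESTMD-style pipeline on a sequence of 1D luminance rows. The time constants are illustrative assumptions, and the optics blur, photoreceptor dynamics, LMC filtering, and adaptive stages of the published RTC-based model are omitted.

        import numpy as np

        def small_target_response(frames, dt=0.01, tau_hp=0.04, tau_delay=0.025):
            """Reduced small-target pipeline on a (time x space) luminance array:
            temporal high-pass filtering, half-wave rectification into ON and OFF
            channels, then correlation of the ON signal with a delayed OFF signal.
            A small dark target sweeping over a pixel produces an OFF transient
            followed by an ON transient, so this product peaks for small moving
            features and stays low for wide-field luminance changes."""
            frames = np.asarray(frames, dtype=float)

            def lowpass(x, tau):                       # first-order low-pass along the time axis
                a = dt / (tau + dt)
                y = np.zeros_like(x)
                for i in range(1, len(x)):
                    y[i] = y[i - 1] + a * (x[i] - y[i - 1])
                return y

            hp = frames - lowpass(frames, tau_hp)      # temporal high-pass
            on = np.maximum(hp, 0.0)                   # brightening transients
            off = np.maximum(-hp, 0.0)                 # darkening transients
            return on * lowpass(off, tau_delay)        # delay-and-correlate OFF with ON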